
    Neural correlates of enhanced visual short-term memory for angry faces: An fMRI study

    Copyright: © 2008 Jackson et al.

    Background: Fluid and effective social communication requires that both face identity and emotional expression information are encoded and maintained in visual short-term memory (VSTM) to enable a coherent, ongoing picture of the world and its players. This appears to be of particular evolutionary importance when confronted with potentially threatening displays of emotion; previous research has shown better VSTM for angry versus happy or neutral face identities.

    Methodology/Principal Findings: Using functional magnetic resonance imaging, here we investigated the neural correlates of this angry face benefit in VSTM. Participants were shown between one and four to-be-remembered angry, happy, or neutral faces, and after a short retention delay they stated whether a single probe face had been present or not in the previous display. All faces in any one display expressed the same emotion, and the task required memory for face identity. We find enhanced VSTM for angry face identities and describe the right hemisphere brain network underpinning this effect, which involves the globus pallidus, superior temporal sulcus, and frontal lobe. Increased activity in the globus pallidus was significantly correlated with the angry benefit in VSTM. Areas modulated by emotion were distinct from those modulated by memory load.

    Conclusions/Significance: Our results provide evidence for a key role of the basal ganglia as an interface between emotion and cognition, supported by a frontal, temporal, and occipital network.

    The authors were supported by a Wellcome Trust grant (grant number 077185/Z/05/Z) and by BBSRC (UK) grant BBS/B/16178.
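    The probe paradigm described above (one to four same-emotion faces, a short retention delay, then a single present/absent probe) can be made concrete with a short trial-generation sketch. This is a minimal illustration, not the authors' stimulus code; the identity labels, function names, and the 50% probe-present rate are assumptions made for the example.

```python
import random
from dataclasses import dataclass

EMOTIONS = ("angry", "happy", "neutral")
SET_SIZES = (1, 2, 3, 4)

@dataclass
class Trial:
    emotion: str          # all faces in the memory array share this expression
    set_size: int         # memory load: number of to-be-remembered identities
    memory_array: list    # face-identity labels shown before the retention delay
    probe: str            # single face identity shown at test
    probe_present: bool   # correct answer: was the probe identity in the array?

def make_trial(identity_pool, emotion, set_size):
    """Build one probe trial: sample identities for the memory array, then pick a
    probe either from the array (present) or from the remaining identities (absent)."""
    identities = random.sample(identity_pool, set_size)
    probe_present = random.random() < 0.5          # assumed 50% probe-present trials
    if probe_present:
        probe = random.choice(identities)
    else:
        probe = random.choice([i for i in identity_pool if i not in identities])
    return Trial(emotion, set_size, identities, probe, probe_present)

if __name__ == "__main__":
    pool = [f"face_{i:02d}" for i in range(20)]    # hypothetical identity labels
    trials = [make_trial(pool, emo, n)
              for emo in EMOTIONS for n in SET_SIZES for _ in range(10)]
    random.shuffle(trials)
    print(trials[0])
```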

    Emotion based attentional priority for storage in visual short-term memory

    A plethora of research demonstrates that the processing of emotional faces is prioritised over non-emotive stimuli when cognitive resources are limited (this is known as ‘emotional superiority’). However, there is debate as to whether competition for processing resources results in emotional superiority per se, or more specifically, threat superiority. Therefore, to investigate prioritisation of emotional stimuli for storage in visual short-term memory (VSTM), we devised an original VSTM report procedure using schematic (angry, happy, neutral) faces in which processing competition was manipulated. In Experiment 1, display exposure time was manipulated to create competition between stimuli. Participants (n = 20) had to recall a probed stimulus from a set size of four under high (150 ms array exposure duration) and low (400 ms array exposure duration) perceptual processing competition. For the high competition condition (i.e. 150 ms exposure), results revealed an emotional superiority effect per se. In Experiment 2 (n = 20), we increased competition by manipulating set size (three versus five stimuli), whilst maintaining a constrained array exposure duration of 150 ms. Here, for the five-stimulus set size (i.e. maximal competition), only threat superiority emerged. These findings demonstrate attentional prioritisation for storage in VSTM for emotional faces. We argue that task demands modulated the availability of processing resources and consequently the relative magnitude of the emotional/threat superiority effect, with only threatening stimuli prioritised for storage in VSTM under more demanding processing conditions. Our results are discussed in light of models and theories of visual selection; they not only combine the two strands of research (i.e. visual selection and emotion) but also highlight that a critical factor in the processing of emotional stimuli is the availability of processing resources, which is further constrained by task demands.
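    The two competition manipulations (exposure duration in Experiment 1, set size in Experiment 2) and the distinction between emotional superiority per se and threat superiority can be sketched as follows. The condition lists and superiority indices are illustrative assumptions, not the authors' analysis code, and the accuracy values are placeholders only.

```python
from itertools import product

EMOTIONS = ("angry", "happy", "neutral")   # schematic face expressions

# Experiment 1: set size fixed at 4; competition varied via array exposure duration.
exp1_conditions = [{"experiment": 1, "set_size": 4, "exposure_ms": dur, "emotion": emo}
                   for dur, emo in product((150, 400), EMOTIONS)]

# Experiment 2: exposure fixed at 150 ms; competition varied via set size.
exp2_conditions = [{"experiment": 2, "set_size": n, "exposure_ms": 150, "emotion": emo}
                   for n, emo in product((3, 5), EMOTIONS)]

def superiority_indices(accuracy):
    """accuracy: dict mapping emotion -> mean probe-report accuracy in one condition.
    'Emotional superiority per se' = both emotional expressions outperform neutral;
    'threat superiority' = an additional advantage for angry over happy faces.
    These contrasts are one plausible way to score the effects, not the published analysis."""
    return {
        "emotional_superiority": (accuracy["angry"] + accuracy["happy"]) / 2 - accuracy["neutral"],
        "threat_superiority": accuracy["angry"] - accuracy["happy"],
    }

# Placeholder numbers only, to show how the indices are read:
print(superiority_indices({"angry": 0.78, "happy": 0.74, "neutral": 0.70}))
```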

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
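    The three-window gaze analysis described above (looking time to prosody-congruent versus incongruent faces in the 0–1250, 1250–2500, and 2500–5000 ms windows) can be sketched as a simple binning routine. The fixation record format, field names, and example values are assumptions for illustration; this is not the authors' analysis pipeline.

```python
# Analysis windows from the study design (ms from visual-array onset).
WINDOWS = {"0-1250": (0, 1250), "1250-2500": (1250, 2500), "2500-5000": (2500, 5000)}

def looking_time_by_window(fixations, prosody_emotion):
    """Sum looking time (ms) to prosody-congruent vs. incongruent faces per window.
    fixations: iterable of (onset_ms, offset_ms, face_emotion) tuples for one trial.
    A fixation spanning a window boundary contributes only its overlap with each window."""
    totals = {name: {"congruent": 0, "incongruent": 0} for name in WINDOWS}
    for onset, offset, face_emotion in fixations:
        for name, (start, end) in WINDOWS.items():
            overlap = min(offset, end) - max(onset, start)
            if overlap > 0:
                key = "congruent" if face_emotion == prosody_emotion else "incongruent"
                totals[name][key] += overlap
    return totals

# Illustrative fixation data for a single trial with fearful prosody.
example_fixations = [(100, 600, "fear"), (700, 1600, "happy"), (1700, 4200, "fear")]
print(looking_time_by_window(example_fixations, prosody_emotion="fear"))
```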

    The dot-probe task to measure emotional attention: A suitable measure in comparative studies?
